Space-Efficient Latent Contracts

Author

  • Michael Greenberg
Abstract

Standard higher-order contract monitoring breaks tail recursion and leads to space leaks that can change a program’s asymptotic complexity; space-efficiency restores tail recursion and bounds the amount of space used by contracts. Space-efficient contract monitoring for contracts enforcing simple type disciplines (a/k/a gradual typing) is well studied. Prior work establishes a space-efficient semantics for manifest contracts without dependency [11]; we adapt that work to a latent calculus with dependency. We guarantee space efficiency when no dependency is used; we cannot generally guarantee space efficiency when dependency is used, but instead offer a framework for making such programs space efficient on a case-by-case basis.
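The space leak described above can be illustrated with a minimal sketch (in Python, using a hypothetical `contract` decorator invented here for illustration; it is not the paper's calculus). Wrapping a recursive function in a contract forces the postcondition check to run after each recursive call returns, so the recursive call is no longer in tail position and the monitor retains one pending check per call:

```python
def contract(pre, post):
    """Hypothetical latent-contract combinator: check `pre` on the way in,
    `post` on the way out. The post-check is what breaks tail recursion."""
    def wrap(f):
        def checked(*args):
            assert pre(*args), "precondition violated"
            result = f(*args)   # NOT a tail call: this frame must survive
            assert post(result), "postcondition violated"
            return result
        return checked
    return wrap

@contract(pre=lambda n, acc=0: n >= 0, post=lambda r: r >= 0)
def count_down(n, acc=0):
    if n == 0:
        return acc
    # The recursive call goes through the contracted wrapper, so even in a
    # language with tail-call optimization the pending post-check keeps a
    # frame live per call: O(n) space instead of O(1).
    return count_down(n - 1, acc + 1)
```

A space-efficient monitor avoids this growth by merging the pending checks into a bounded representation rather than stacking one per call, which is what restores tail recursion in the non-dependent case.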


Related articles

Unsupervised Learning with Imbalanced Data via Structure Consolidation Latent Variable Model

Unsupervised learning on imbalanced data is challenging because, when given imbalanced data, current models are often dominated by the major category and ignore the categories with small amounts of data. We develop a latent variable model that can cope with imbalanced data by dividing the latent space into a shared space and a private space. Based on Gaussian Process Latent Variable Models, we pr...


Density estimation using Real NVP

Unsupervised learning of probabilistic models is a central yet challenging problem in machine learning. Specifically, designing models with tractable learning, sampling, inference and evaluation is crucial in solving this task. We extend the space of such models using real-valued non-volume preserving (real NVP) transformations, a set of powerful, stably invertible, and learnable transformation...


Sparse Latent Semantic Analysis

Latent semantic analysis (LSA), as one of the most popular unsupervised dimension reduction tools, has a wide range of applications in text mining and information retrieval. The key idea of LSA is to learn a projection matrix that maps the high dimensional vector space representations of documents to a lower dimensional latent space, i.e. so called latent topic space. In this paper, we propose ...


Kernel Topic Models

Latent Dirichlet Allocation models discrete data as a mixture of discrete distributions, using Dirichlet beliefs over the mixture weights. We study a variation of this concept, in which the documents’ mixture weight beliefs are replaced with squashed Gaussian distributions. This allows documents to be associated with elements of a Hilbert space, admitting kernel topic models (KTM), modelling te...


Elimination of latent variables in behavioral systems: a geometric approach

This paper considers computational aspects of the problem of elimination of variables. More precisely, the problem of elimination of latent variables in models is addressed in the behavioral framework. In earlier contributions on this theme, the model classes of infinitely smooth (C∞) and square-integrable (L2) linear time-invariant systems have been considered. For both system classes, conditions f...



Journal:
  • CoRR

Volume abs/1604.02474  Issue -

Pages -

Publication date 2016